# Week 1 Wednesday 6/21 brief notes.

## The story so far.

Last time we discussed functions, specifically the logarithm and the exponential function; we will see how to define them later. Since they are inverse functions of each other, we discussed some basic facts about invertibility (and one-to-one-ness), and how these relate to continuity and differentiability:

1. If a function $f$ is one-to-one on its domain $D$, then $f$ has an inverse $f^{-1}$ with domain equal to the range of $f$.
2. A continuous function $f$ on an interval is one-to-one if and only if $f$ is strictly increasing or strictly decreasing.
3. If a differentiable function $f$ on an interval $I$ has derivative $f'(x) > 0$ for all $x \in I$, or $f'(x) < 0$ for all $x \in I$, then $f$ is strictly increasing or strictly decreasing, respectively.
4. A differentiable function $f$ on an interval $I$ is strictly increasing or strictly decreasing if and only if $f'(x) \ge 0$ for all $x\in I$ or $f'(x) \le 0$ for all $x\in I$, and the set $Z=\{x\in I: f'(x)=0\}$ contains no intervals (i.e. $f$ is not constant on any subinterval).

As a point of review, let us see why a differentiable function $f$ on $I$ with $f'(x) > 0$ for all $x \in I$ is strictly increasing: Take two points $a < b$ in the interval $I$; to show $f$ is strictly increasing, we need to show $f(a) < f(b)$. Since $f$ is differentiable on the interval $[a,b]$, by the **mean value theorem** there exists a point $c \in (a,b)$ such that $f'(c) = \frac{f(b)-f(a)}{b-a}$. But we have $f'(c) > 0$ by hypothesis, so $\frac{f(b)-f(a)}{b-a} > 0$. As $b > a$, this implies $f(b) - f(a) > 0$, or $f(a) < f(b)$!

Ok, let us resume the story of inverse functions.

## Inverses and differentiation.

As it turns out,

> **Theorem.**
> Let $f$ be a continuous function on an interval $I$ with range $J$ (necessarily also an interval). If $f$ has an inverse, then its inverse function $f^{-1}$ is continuous on $J$, and has range $I$.

This is perhaps not too surprising, since geometrically the graph of $f^{-1}$ is just a mirror image of the graph of $f$, and if we can "draw $f$ without lifting our pencil", we should be able to do the same for $f^{-1}$.

Continuing with this geometric intuition, one could imagine that if $f$ is differentiable on an interval and $f$ has an inverse, then $f^{-1}$ ought to be differentiable as well (think about tangent lines), at least wherever $f'(x) \neq 0$ (think about what happens if $f'(x)=0$ and the corresponding graphs of $f$ and $f^{-1}$). Further, if $f(a)=b$ and $f^{-1}(b)=a$, then the slope of the tangent line of $f$ at $a$ should be the reciprocal of the slope of the tangent line of $f^{-1}$ at $b$, because as you reflect across the line $y=x$, you switch $x$ with $y$, so $\frac{\text{rise}}{\text{run}}$ becomes $\frac{\text{run}}{\text{rise}}$. Mathematically,

> **Theorem.**
> If $f$ is differentiable, and $f(a)=b$ with $f^{-1}(b)=a$, then $$ (f^{-1})'(b)= \frac{1}{f'(a)} $$ provided that $f'(a)\neq 0$. (In particular, $f^{-1}$ is also differentiable at the points where $f'$ is not zero.)

Another way to remember this rule is by the chain rule: Note $(f\circ f^{-1})(x)=x$, as they undo each other. So if we differentiate both sides, the LHS is given by the chain rule: $$ \frac{d}{dx}(f\circ f^{-1})(x)=f'(f^{-1}(x))\cdot(f^{-1})'(x) $$ while the RHS is just $\frac{d}{dx}(x)=1$. So $$ (f^{-1})'(x) = \frac{1}{f'(f^{-1}(x))} $$ Finally, if $f(a)=b$ and $f^{-1}(b)=a$, we can rewrite the above in the form given in the theorem.
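To make the rule concrete, here is a small numerical sanity check (a sketch in Python; the test function $f(x)=x^3+x$ and the helper `f_inverse` are just illustrative choices, not part of the theorem): we approximate $(f^{-1})'(b)$ by a difference quotient of the inverse and compare it against $\frac{1}{f'(a)}$.

```python
# Sanity check of (f^{-1})'(b) = 1/f'(a), where f(a) = b.
# Illustrative test function: f(x) = x^3 + x, strictly increasing on R
# since f'(x) = 3x^2 + 1 > 0, hence invertible.

def f(x):
    return x**3 + x

def f_prime(x):
    return 3 * x**2 + 1

def f_inverse(y, lo=-10.0, hi=10.0, tol=1e-12):
    """Invert f by bisection; valid because f is strictly increasing."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if f(mid) < y:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

a = 1.5
b = f(a)                  # so f(a) = b and f^{-1}(b) = a
h = 1e-6
# Difference quotient approximating (f^{-1})'(b):
inv_slope = (f_inverse(b + h) - f_inverse(b - h)) / (2 * h)
print(inv_slope, 1 / f_prime(a))   # both should be close to 1/7.75 ≈ 0.129
```

The two printed numbers agree (up to the numerical error of the bisection and the difference quotient), as the theorem predicts.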
**Example.** Suppose $f(x) = 3x + \sin(x)$ on $\mathbb{R}$. Let us answer some questions.

(1) Is $f$ one-to-one on $\mathbb{R}$?

$\blacktriangleright$ Yes: notice that $f$ is differentiable on $\mathbb{R}$ and that $f'(x)=3 +\cos(x) > 0$ for all $x\in \mathbb{R}$, so $f$ is strictly increasing and hence $f$ is one-to-one. $\blacklozenge$

(2) Does $f$ have an inverse on $\mathbb{R}$?

$\blacktriangleright$ Yes: $f$ is one-to-one, hence $f$ has an inverse on $\mathbb{R}$. $\blacklozenge$

(3) What is $(f^{-1})'(3\pi)$?

$\blacktriangleright$ To do this, we first need to find out which two points correspond to each other. Suppose $f^{-1}(3 \pi)=a$; then $f(a) = 3\pi$, that is, $3a+\sin(a)=3\pi$. This means $a=\pi$. So $f^{-1}(3\pi)=\pi$. So by applying the rule we have $$ (f^{-1})'(3\pi)= \frac{1}{f'(\pi)}=\frac{1}{3+\cos{\pi}}=\frac{1}{3-1}=\frac{1}{2}. \quad\blacklozenge $$

**Another example.** Consider $f(x) = \int_3^x \sqrt{1+t^3}\, dt$.

(1) Find the largest possible domain $D$ for this function $f$.

$\blacktriangleright$ Since the integrand has a square root, we need the expression inside the square root to be nonnegative. So we require $1+t^3 \ge 0$, or $t \ge -1$. Hence the largest possible domain is $D = [-1,\infty)$. $\blacklozenge$

(2) Does $f$ have an inverse on this domain $D$?

$\blacktriangleright$ Yes: note that by the **fundamental theorem of calculus** the derivative is $$ f'(x) = \sqrt{1+x^3} \ge 0 $$ on $D$, and that $f'(x) = 0$ only at $x=-1$, so the set of zero derivatives $Z = \{-1\}$ contains no interval. Hence $f$ is strictly increasing, hence one-to-one on $D$, and hence $f$ has an inverse on $D$. $\blacklozenge$

(3) Find $(f^{-1})'(0)$.

$\blacktriangleright$ To do this, we first find the corresponding pair of points. If $f^{-1}(0)=a$, then $f(a)=0$. This means $$ \int_3 ^a \sqrt{1+t^3}\,dt=0. $$ Using the area interpretation, this can only happen if $a=3$. So $f^{-1}(0)=3$. And by applying the rule we have $$ (f^{-1})'(0)=\frac{1}{f'(3)}= \frac{1}{\sqrt{1+3^3}}=\frac{1}{\sqrt{28}}. \quad \blacklozenge $$

By the way, this example illustrates a function $f$ that is defined via an **integral**, and this is precisely what we will do to define the natural logarithm.

## The natural logarithm (6.2 $\ast$).

We shall now define the function $\ln(x)$.

> **Definition.**
> Define for all $x > 0$, $$ \ln(x) = \int_1 ^x \frac{1}{t} dt $$

So geometrically $\ln(x)$ is some area-counting function,

![[1 teaching/smc-summer-2023-math-8/notes/week-1/---files/week-1-wednesday-notes 2023-06-25 11.27.49.excalidraw.svg]]

where $$ \begin{align*} x > 1 & \implies\ln(x) > 0\\ x = 1 & \implies \ln(x)= 0 \\ 0 < x < 1 & \implies \ln(x) < 0 \end{align*} $$

We can estimate the value of $\ln(x)$ geometrically; a quick numerical illustration of this idea appears below, and we then work out an estimate by hand.
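Since $\ln(n)$ is the area under $y=\frac{1}{t}$ from $1$ to $n$, rectangle sums of width $1$ bracket it from below and above. Here is a minimal sketch of that idea in Python (the function name `ln_rectangle_bounds` is just an illustrative choice):

```python
import math

def ln_rectangle_bounds(n):
    """Bracket ln(n) between unit-width rectangle sums under/over y = 1/t."""
    # y = 1/t is decreasing, so right-endpoint rectangles undershoot
    # and left-endpoint rectangles overshoot the area.
    lower = sum(1 / k for k in range(2, n + 1))   # 1/2 + 1/3 + ... + 1/n
    upper = sum(1 / k for k in range(1, n))       # 1 + 1/2 + ... + 1/(n-1)
    return lower, upper

lo, hi = ln_rectangle_bounds(5)
print(lo, math.log(5), hi)   # roughly 1.283 < 1.609 < 2.083
```

These unit-width rectangle sums are exactly the bounds we now draw and record as the harmonic estimates.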
Say we use **rectangles**; then we can estimate $\ln(5)$ as follows:

![[smc-summer-2023-math-8/notes/week-1/---files/week-1-wednesday-notes 2023-06-22 14.21.55.excalidraw.svg]]

And we see that $$ {\color{blue} \frac{1}{2}+\frac{1}{3}+\frac{1}{4}+\frac{1}{5}} < \ln(5) < {\color{red} 1+\frac{1}{2} + \frac{1}{3} + \frac{1}{4} } $$

So in general we have:

> **Harmonic estimates.** For any positive integer $n\ge 2$, we have $$ \frac{1}{2} + \frac{1}{3} + \cdots + \frac{1}{n}< \ln(n) < 1 + \frac{1}{2}+\cdots+ \frac{1}{n-1} $$

**This is an important estimate!** If we denote by $H_n$ the $n$-th harmonic number, where $$ H_n = 1 + \frac{1}{2} + \frac{1}{3} + \frac{1}{4} +\cdots + \frac{1}{n} $$ then we can re-express the above bounds as

> $$ H_{n} - 1 < \ln(n) < H_{n-1} $$

for all integers $n \ge 2$.

## Derivatives of $\ln(x)$.

Note that by the fundamental theorem of calculus,

> $$ \frac{d}{dx}\ln(x)= \frac{d}{dx} \int_1^x \frac{1}{t}dt =\frac{1}{x} > 0 $$

So this also shows $\ln(x)$ is **strictly increasing** on its domain $x > 0$, hence one-to-one, and hence it has an inverse! Later, the inverse of $\ln(x)$ will be defined to be the natural exponential function $\exp(x)$.

Furthermore, we can take its second derivative and see that $$ \left( \frac{d}{dx} \right)^2\ln(x)=\frac{d}{dx}\left( \frac{1}{x} \right) = -\frac{1}{x^2} < 0 $$ on $x > 0$. So $\ln(x)$ has a graph that is **concave down**.

## Properties of natural logarithm.

Okay, we defined $\ln(x) = \int_1 ^x \frac{1}{t}dt$ on $x > 0$ via an integral. But does it satisfy the properties that we think it should? In particular, does it have the same effect as Napier's log, where we have a "product to sum" law? Indeed, we do have:

> **Properties of natural logarithm.**
> For all positive $x,y > 0$, and rational $r \in \mathbb{Q}$, we have
> (1) $\ln(xy) = \ln(x) + \ln(y)$.
> (2) $\ln\left( \frac{x}{y} \right)=\ln(x) - \ln(y)$.
> (3) $\ln(x^r) = r \ln(x)$.

Let us see how to prove (1) here:

$\blacktriangleright$ Proof of (1): Let us fix $y > 0$ as a positive constant (so it is not a variable), and consider the function $f(x) = \ln(xy)$ on $x > 0$. Then by the chain rule, we have $$ f'(x) = \frac{1}{xy} \cdot \frac{d}{dx}(xy)=\frac{1}{xy} \cdot y = \frac{1}{x} $$ This is interesting, because we also have $\frac{d}{dx}\ln(x) = \frac{1}{x}$ on $x > 0$. That is to say, $f(x)$ and $\ln(x)$ have the **same derivative** on the interval $(0,\infty)$. By the **uniqueness of antiderivatives up to a constant on an interval**, this means $f(x) = \ln(x) + C$ for some constant $C$. To determine this constant, we set $x = 1$, so $f(1)=\ln(1)+C=C$. But $f(1) = \ln(y)$, so $C = \ln(y)$. Hence $f(x)=\ln(x) + \ln(y)$. In other words, $\ln(xy)=\ln(x)+\ln(y)$. Wow! $\blacksquare$

$\blacktriangleright$ To prove (2), we first work out what $\ln\left( \frac{1}{y} \right)$ is. Notice that $1 = y\cdot \frac{1}{y}$, so $\ln(1) = \ln\left( y\cdot\frac{1}{y} \right)\stackrel{(1)}{=}\ln(y)+\ln\left( \frac{1}{y} \right)$, and since $\ln(1)=0$ we get $\ln\left( \frac{1}{y} \right)=-\ln(y)$. Then by applying (1) again, we have $\ln\left( \frac{x}{y} \right)=\ln(x) + \ln\left( \frac{1}{y} \right)=\ln(x)-\ln(y)$. $\blacksquare$

$\blacktriangleright$ To see (3), we do a similar trick as in (1). Fix an $r$ and consider the function $f(x) =\ln(x^r)$.
Then by the chain rule $f'(x) = \frac{1}{x^r}\cdot \frac{d}{dx}(x^r)=\frac{1}{x^r}rx^{r-1}=\frac{r}{x}$. Next we consider another function $g(x) = r \ln(x)$, and note $g'(x)=\frac{r}{x}$. So $f'(x) = g'(x)$, and again by the **uniqueness of antiderivatives up to a constant**, we have $f(x) = g(x) + C$ for some constant $C$. So $\ln(x^r) = r\ln(x) + C$. If we set $x =1$, we find $C=0$, and we have our identity. $\blacksquare$

## Limiting behavior of natural logarithm and its graph.

We know that $\ln(x)$ has domain $x > 0$ and is strictly increasing and concave down, but what are its limits at $0^+$ and at $+\infty$?

Let us consider the sequence $2^n$. Note that $\ln(2^n)\stackrel{(3)}{=}n\ln(2)\to + \infty$ as $n\to\infty$, and as $\ln(x)$ is strictly increasing, we must also have $$\lim_{x\to+\infty} \ln(x) = +\infty$$

As for the limit as $x\to0^+$, we use a change-of-variable trick: Set $t=\frac{1}{x}$; then as $x\to0^+$ we have $t\to +\infty$, so $$ \lim_{x\to 0^+} \ln(x) = \lim_{t\to+\infty} \ln\left( \frac{1}{t} \right)=\lim_{t\to+\infty}-\ln(t) = -\infty. $$

So $\ln(x)$ has a graph that goes to negative infinity near $0^+$, is zero at $x=1$, and goes to positive infinity as $x\to+\infty$; it is strictly increasing and concave down. Here is its graph:

![[1 teaching/smc-summer-2023-math-8/notes/week-1/---files/week-1-wednesday-notes 2023-06-22 13.53.42.excalidraw.svg]]

## Addendum.

Someone in class raised the question: Why do we even care to define $\ln(x)$ as the integral $\int_1 ^x \frac{1}{t} dt$? Certainly there are many reasons, but here I offer the one that I gave in class:

One of the first derivative rules you learned is the **power rule**: $$ \frac{d}{dx}x^n=nx^{n-1} $$ This rule takes a power function and gives you another power function. In particular, $$ x^{n-1}=\frac{1}{n} \frac{d}{dx}(x^n) $$ Or, $$ x^n = \frac{1}{n+1} \frac{d}{dx}(x^{n+1})= \frac{d}{dx} \left(\frac{x^{n+1}}{n+1}\right) $$ which shows we can get every power of $x$ as the derivative of something, except for $x^{-1}=\frac{1}{x}$ (note we cannot plug $n=-1$ into that formula). So naturally we ask: is there a function whose derivative is $\frac{1}{x}$? Indeed, the fundamental theorem of calculus gives a simple answer: Take $f(x)=\int_1^x \frac{1}{t}dt$! By FTC, its derivative is precisely $\frac{1}{x}$. This function is what we call $\ln(x)$.

By the way, there are **two forms** of the fundamental theorem of calculus; many books call them FTC1 and FTC2. I never remember which one is which, but I know what each is doing, so I give them different adjectives instead:

> **FTC: Existence form.**
> Given any continuous function $f$ on some interval $[a,b]$, the function $$ g(x)=\int_a^x f(t)\,dt $$ is continuous on $[a,b]$, differentiable on $(a,b)$, and $g'(x)=f(x)$.

The above is called the **existence** form because it asserts the **existence of an antiderivative**: we can directly "construct" an antiderivative of any continuous function that we like.
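Here is a minimal numerical sketch of the existence form (in Python, with $f(t)=\cos(t)$ chosen only as an example of a continuous function): we build $g(x)=\int_0^x f(t)\,dt$ by a Riemann sum and check that a difference quotient of $g$ recovers $f(x)$.

```python
import math

def f(t):
    return math.cos(t)   # an example continuous function

def g(x, a=0.0, steps=20000):
    """Approximate g(x) = integral of f from a to x by a midpoint Riemann sum."""
    dx = (x - a) / steps
    return sum(f(a + (k + 0.5) * dx) for k in range(steps)) * dx

x = 1.3
h = 1e-4
derivative_of_g = (g(x + h) - g(x - h)) / (2 * h)   # difference quotient of g
print(derivative_of_g, f(x))   # both ≈ cos(1.3) ≈ 0.2675, as FTC predicts
```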
> **FTC: Uniqueness form.**
> Given $f(x)$ continuous on $[a,b]$, if $F(x)$ is **any** antiderivative of $f(x)$, that is $F'(x) = f(x)$, then $$ \int_a^b f(x)dx=F(b)-F(a) $$

The above is called the **uniqueness** form because we can use **any** antiderivative of $f$ to evaluate the definite integral. This is because **any two antiderivatives of the same function agree up to a constant**, so if there is another $G$ such that $G'=f=F'$, then $F(x)=G(x)+C$, which means $$\begin{align*} \int_a^b f(x)dx &= F(b)-F(a)\\ &=(G(b)+C) - (G(a)+C) \\ &= G(b)-G(a) \end{align*}$$ This shows it doesn't matter which antiderivative you choose!
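As a quick illustration, here is a minimal sketch (in Python, with the integrand $f(x)=3x^2$ and the two antiderivatives $F(x)=x^3$ and $G(x)=x^3+7$ chosen only as examples): a Riemann-sum approximation of $\int_1^2 f$ matches both $F(2)-F(1)$ and $G(2)-G(1)$, and the added constant cancels.

```python
def f(x):
    return 3 * x**2          # example integrand

def F(x):
    return x**3              # one antiderivative of f

def G(x):
    return x**3 + 7          # another antiderivative, differing by a constant

def integral(func, a, b, steps=100000):
    """Midpoint Riemann sum approximating the integral of func from a to b."""
    dx = (b - a) / steps
    return sum(func(a + (k + 0.5) * dx) for k in range(steps)) * dx

a, b = 1.0, 2.0
print(integral(f, a, b))   # ≈ 7.0
print(F(b) - F(a))         # exactly 7.0
print(G(b) - G(a))         # also 7.0; the constant cancels
```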